ai-memory — Persistent Memory for Any AI

@AlphaOne LLC

Persistent memory for any AI assistant. Zero token cost until recall. Stores memories locally in SQLite, ranks by 6-factor scoring, returns results in TOON compact format (79% smaller than JSON). 17 MCP tools, 20 HTTP endpoints, 25 CLI commands. 4 tiers from keyword to autonomous with local LLMs via Ollama. 97.8% recall accuracy on ICLR 2025 LongMemEval benchmark. Works with Claude, ChatGPT, Grok, Cursor, Windsurf, Continue.dev, OpenClaw, Llama, and any MCP client. Open source, MIT licensed, written in Rust.
Overview

Zero token cost until recall. Typical built-in memory systems inject 200+ lines of context into every message; ai-memory uses zero context tokens until memory_recall is called. Only relevant memories are returned, ranked by 6-factor scoring, and TOON-format responses are 79% smaller than JSON.
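To illustrate why TOON is more compact than JSON: it states the field names once in a header row instead of repeating keys per object. A sketch of the same two records in both formats (the field names here are invented for illustration, not ai-memory's actual response schema):

```text
JSON (keys repeated per object):
[{"id": 1, "text": "prefers dark mode", "score": 0.92},
 {"id": 2, "text": "works in Rust", "score": 0.87}]

TOON (one header row, then comma-separated rows):
memories[2]{id,text,score}:
  1,prefers dark mode,0.92
  2,works in Rust,0.87
```

The savings grow with the number of records, since the per-object key overhead in JSON is paid only once in TOON's header.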

ICLR 2025 LongMemEval Benchmark

97.8% R@5 (489/500) · 99.8% R@20 (499/500) · 232 q/s · $0 cloud cost

17 MCP Tools

memory_store · memory_recall · memory_search · memory_list · memory_get · memory_update · memory_delete · memory_promote · memory_forget · memory_link · memory_get_links · memory_consolidate · memory_capabilities · memory_expand_query · memory_auto_tag · memory_detect_contradiction · memory_stats
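These tools are invoked over the standard MCP JSON-RPC `tools/call` method. A sketch of a memory_recall request (the argument names `query` and `limit` are assumptions for illustration; memory_capabilities reports the actual schemas):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "memory_recall",
    "arguments": { "query": "user preferences", "limit": 5 }
  }
}
```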

4 Tiers (all local, no cloud)

keyword → semantic (default) → smart (Ollama) → autonomous (Ollama)
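The tier is selected with the `--tier` flag shown in the Server Config section. A hedged variant of that config selecting the smart tier, which assumes a local Ollama instance is running:

```json
{
  "mcpServers": {
    "memory": {
      "command": "ai-memory",
      "args": ["--db", "~/.claude/ai-memory.db", "mcp", "--tier", "smart"]
    }
  }
}
```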

8 AI Platforms

Claude · Codex · Gemini · Cursor · Windsurf · Continue.dev · Grok · Llama

Install

cargo install ai-memory

Rust + SQLite + FTS5 + HNSW. 161 tests. MIT licensed.

Server Config

{
  "mcpServers": {
    "memory": {
      "command": "ai-memory",
      "args": [
        "--db",
        "~/.claude/ai-memory.db",
        "mcp",
        "--tier",
        "semantic"
      ]
    }
  }
}
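Where this block lives depends on the MCP client, not on ai-memory itself. For Claude Desktop, the documented config file locations are:

```text
macOS:   ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
```

Other clients (Cursor, Windsurf, Continue.dev) each keep their own MCP server configuration file; consult that client's documentation for the path.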